Exponential Graph Regularized Non-Negative Low-Rank Factorization for Robust Latent Representation

Authors

Abstract

Non-negative matrix factorization (NMF) is a fundamental theory that has received much attention and is widely used in image engineering, pattern recognition, and other fields. However, the classical NMF has limitations, such as focusing only on local information and being sensitive to noise and to small sample size (SSS) problems. Therefore, how to develop NMF so as to improve the performance and robustness of the algorithm is a worthy challenge. Addressing the bottlenecks above, we propose an exponential graph regularized non-negative low-rank factorization (EGNLRF) algorithm combining sparseness, low rank, and the matrix exponential. Firstly, based on the assumption that the data are corrupted, we decompose the given raw data item with an error fitting matrix and apply a low-rank constraint to denoise the data. Then, we perform non-negative factorization on the resulting matrix, from which we derive a low-dimensional representation of the original matrix. Finally, we use this low-dimensional representation for graph embedding to maintain the geometry between samples. The graph embedding terms are exponentiated to cope with SSS problems and nearest-neighbor sensitivity. The above three steps are incorporated into a joint framework in which they validate and optimize each other; the model can therefore learn latent representations that are undisturbed by noise and preserve the structure of the known samples. We conducted simulation experiments on different datasets and verified the effectiveness of the proposed method by comparing it with existing ones related to NMF, low rank, and graph embedding.
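The abstract describes a three-part joint objective: an error-fitting decomposition with a low-rank constraint, a non-negative factorization, and an exponentiated graph-embedding term. A minimal sketch of one plausible instantiation of such an objective is given below; the squared Frobenius fit, nuclear-norm and l1 penalties, Laplacian exponential, and all symbols and weights are illustrative assumptions rather than the authors' exact formulation, and in a joint framework the objective would be minimized by alternating updates of W, H, and E, which the sketch does not implement.

```python
# A minimal sketch of one assumed form of an EGNLRF-style joint objective.
# All symbols (W, H, E, L) and weights are illustrative, not the authors' notation.
import numpy as np
from scipy.linalg import expm

def egnlrf_objective(X, W, H, E, L, lam_rank=1.0, lam_sparse=0.1, lam_graph=0.1):
    """Evaluate an assumed joint objective.

    X : (m, n) raw data, assumed corrupted
    W : (m, k) non-negative basis;  H : (k, n) non-negative coefficients
    E : (m, n) error-fitting matrix absorbing the noise
    L : (n, n) graph Laplacian built from the samples
    """
    D = W @ H                                      # denoised, non-negative low-rank part
    fit = 0.5 * np.linalg.norm(X - D - E, 'fro') ** 2
    low_rank = np.linalg.norm(D, 'nuc')            # low-rank constraint on the clean part
    sparsity = np.abs(E).sum()                     # sparse error fitting
    graph = np.trace(H @ expm(L) @ H.T)            # exponentiated graph-embedding term
    return fit + lam_rank * low_rank + lam_sparse * sparsity + lam_graph * graph
```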


Similar articles

Efficient Rank-one Residue Approximation Method for Graph Regularized Non-negative Matrix Factorization

Nonnegative matrix factorization (NMF) aims to decompose a given data matrix X into the product of two lower-rank nonnegative factor matrices, UV^T. Graph regularized NMF (GNMF) is a recently proposed NMF method that preserves the geometric structure of X during such decomposition. Although GNMF has been widely used in computer vision and data mining, its multiplicative update rule (MUR) based ...
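For context, a compact sketch of the classical multiplicative update rules (MUR) that this entry contrasts with its rank-one residue approach is given below; it minimizes ||X - UV^T||_F^2 + lam * tr(V^T L V) with L = D - W, and the variable names, initialization, and iteration count are illustrative choices, not taken from the cited paper.

```python
# A compact sketch of the standard GNMF multiplicative updates.
import numpy as np

def gnmf_mur(X, W, k, lam=1.0, n_iter=200, eps=1e-10, seed=0):
    """X: (m, n) non-negative data; W: (n, n) symmetric affinity of the sample graph."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    D = np.diag(W.sum(axis=1))                 # degree matrix; Laplacian is L = D - W
    for _ in range(n_iter):
        U *= (X @ V) / (U @ V.T @ V + eps)
        V *= (X.T @ U + lam * W @ V) / (V @ U.T @ U + lam * D @ V + eps)
    return U, V
```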


Graph Regularized Low Rank Representation for Aerosol Optical Depth Retrieval

In this paper, we propose a novel data-driven regression model for aerosol optical depth (AOD) retrieval. First, we adopt a low rank representation (LRR) model to learn a powerful representation of the spectral response. Then, graph regularization is incorporated into the LRR model to capture the local structure information and the nonlinear property of the remote-sensing data. Since it is easy...
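The graph regularizer mentioned here requires a sample graph built from the data; a minimal sketch of one common construction (a symmetrized k-nearest-neighbour affinity with heat-kernel weights, followed by the unnormalized Laplacian) is shown below. The neighbourhood size and bandwidth are illustrative, not the cited paper's settings.

```python
# A minimal sketch of a common kNN graph Laplacian construction for graph regularization.
import numpy as np

def knn_graph_laplacian(X, k=5, sigma=1.0):
    """X: (n_samples, n_features). Returns the (n, n) unnormalized Laplacian L = D - W."""
    n = X.shape[0]
    d2 = np.square(X[:, None, :] - X[None, :, :]).sum(-1)    # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]                     # skip the point itself
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2 * sigma ** 2))  # heat-kernel weights
    W = np.maximum(W, W.T)                                     # symmetrize the affinity
    return np.diag(W.sum(axis=1)) - W
```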


Gene Feature Extraction Based on Nonnegative Dual Graph Regularized Latent Low-Rank Representation

Aiming at the problem of the high redundancy and heavy noise of gene expression profiles, a new feature extraction model based on nonnegative dual graph regularized latent low-rank representation (NNDGLLRR) is presented on the basis of latent low-rank representation (Lat-LRR). By introducing a dual graph manifold regularized constraint, the NNDGLLRR can keep the internal spatial structure of the origin...


Graph Regularized Non-negative Matrix Factorization By Maximizing Correntropy

Non-negative matrix factorization (NMF) has proved effective in many clustering and classification tasks. The classic ways to measure the errors between the original and the reconstructed matrix are the l2 distance or the Kullback-Leibler (KL) divergence. However, nonlinear cases are not properly handled when we use these error measures. As a consequence, alternative measures based on nonlinear kernels,...
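A minimal sketch of the correntropy measure contrasted here with the l2 and KL errors follows: the residual between the data and its reconstruction is passed element-wise through a Gaussian kernel and averaged, so maximizing correntropy (equivalently, minimizing its negative) downweights large, outlier-like errors. The kernel bandwidth and the factorization form X ~ UV^T are illustrative assumptions.

```python
# A minimal sketch of a correntropy-based error measure for matrix factorization.
import numpy as np

def correntropy(X, X_hat, sigma=1.0):
    """Average Gaussian-kernel similarity between X and its reconstruction X_hat."""
    residual = X - X_hat
    return np.mean(np.exp(-residual ** 2 / (2 * sigma ** 2)))

def negative_correntropy_loss(X, U, V, sigma=1.0):
    """Loss to minimize when fitting X ~ U @ V.T by maximizing correntropy."""
    return -correntropy(X, U @ V.T, sigma)
```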


Robust latent low rank representation for subspace clustering

Subspace clustering has found wide applications in machine learning, data mining, and computer vision. Latent Low Rank Representation (LatLRR) is one of the state-of-the-art methods for subspace clustering. However, its effectiveness is undermined by a recent discovery that the solution to the noiseless LatLRR model is non-unique. To remedy this issue, we propose choosing the sparsest solution i...
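Once a representation matrix Z has been learned by an LRR-style model, subspace-clustering labels are conventionally obtained by symmetrizing |Z| into an affinity and applying spectral clustering. A minimal sketch of that post-processing step is shown below; it omits the LatLRR solver itself and follows a common convention rather than necessarily the exact procedure of the cited paper.

```python
# A minimal sketch: turn a learned representation matrix Z into cluster labels.
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_from_representation(Z, n_clusters):
    """Z: (n, n) low-rank representation over the samples."""
    affinity = np.abs(Z) + np.abs(Z).T            # symmetric non-negative affinity
    labels = SpectralClustering(
        n_clusters=n_clusters, affinity='precomputed', assign_labels='discretize'
    ).fit_predict(affinity)
    return labels
```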



Journal

Journal title: Mathematics

Year: 2022

ISSN: 2227-7390

DOI: https://doi.org/10.3390/math10224314